Results 1-20 of 15,573
1.
PLoS One ; 19(4): e0301896, 2024.
Article in English | MEDLINE | ID: mdl-38598520

ABSTRACT

This study investigates whether humans recognize different emotions conveyed only by the kinematics of a single moving geometrical shape and how this competence unfolds during development, from childhood to adulthood. To this aim, animations in which a shape moved according to happy, fearful, or neutral cartoons were shown, in a forced-choice paradigm, to 7- and 10-year-old children and adults. Accuracy and response times were recorded, and the movement of the mouse while the participants selected a response was tracked. Results showed that 10-year-old children and adults recognize happiness and fear when conveyed solely by different kinematics, with an advantage for fearful stimuli. Fearful stimuli, together with neutral stimuli, were also accurately identified by 7-year-olds, while at this age accuracy for happiness was not significantly different from chance. Overall, the results demonstrate that emotions can be identified from single-point motion alone during both childhood and adulthood. Moreover, motion contributes to the comprehension of emotions to varying degrees, with fear recognized earlier in development and more readily even later on, when all emotions are accurately labeled.


Subject(s)
Emotions, Facial Expression, Adult, Child, Humans, Biomechanical Phenomena, Emotions/physiology, Fear, Happiness
2.
Cereb Cortex ; 34(4)2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38566513

ABSTRACT

The perception of facial expression plays a crucial role in social communication, and it is known to be influenced by various facial cues. Previous studies have reported both positive and negative biases toward overweight individuals, but it is unclear whether facial cues, such as facial weight, bias facial expression perception. Combining psychophysics and event-related potential technology, the current study adopted a cross-adaptation paradigm to examine this issue. The psychophysical results of Experiments 1A and 1B revealed a bidirectional cross-adaptation effect between overweight and angry faces. Adapting to overweight faces decreased the likelihood of perceiving ambiguous emotional expressions as angry compared to adapting to normal-weight faces. Likewise, exposure to angry faces subsequently caused normal-weight faces to appear thinner. These findings were corroborated by bidirectional event-related potential results, showing that adaptation to overweight faces relative to normal-weight faces modulated the event-related potential responses to emotionally ambiguous facial expressions (Experiment 2A); vice versa, adaptation to angry faces relative to neutral faces modulated the event-related potential responses to faces ambiguous in facial weight (Experiment 2B). Our study provides direct evidence associating overweight faces with facial expression, suggesting at least partly common neural substrates for the perception of overweight and angry faces.


Subject(s)
Facial Expression, Weight Prejudice, Humans, Overweight, Anger/physiology, Evoked Potentials/physiology, Emotions/physiology
3.
Hum Brain Mapp ; 45(5): e26673, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38590248

ABSTRACT

The amygdala is important for human fear processing. However, recent research has failed to reveal specificity, with evidence that the amygdala also responds to other emotions. A more nuanced understanding of the amygdala's role in emotion processing, particularly relating to fear, is needed given the importance of effective emotional functioning for everyday function and mental health. We studied 86 healthy participants (44 females), aged 18-49 (mean 26.12 ± 6.6) years, who underwent multiband functional magnetic resonance imaging. We specifically examined the reactivity of four amygdala subregions (using regions of interest analysis) and related brain connectivity networks (using generalized psycho-physiological interaction) to fear, angry, and happy facial stimuli using an emotional face-matching task. All amygdala subregions responded to all stimuli (p-FDR < .05), with this reactivity strongly driven by the superficial and centromedial amygdala (p-FDR < .001). Yet amygdala subregions selectively showed strong functional connectivity with other occipitotemporal and inferior frontal brain regions with particular sensitivity to fear recognition and strongly driven by the basolateral amygdala (p-FDR < .05). These findings suggest that amygdala specialization to fear may not be reflected in its local activity but in its connectivity with other brain regions within a specific face-processing network.


Subject(s)
Brain, Emotions, Female, Humans, Emotions/physiology, Fear/psychology, Amygdala/physiology, Happiness, Brain Mapping/methods, Magnetic Resonance Imaging, Facial Expression
4.
PLoS One ; 19(4): e0290590, 2024.
Article in English | MEDLINE | ID: mdl-38635525

ABSTRACT

Spontaneous smiles in response to politicians can serve as an implicit barometer for gauging electorate preferences. However, it is unclear whether a subtle Duchenne smile, an authentic expression involving the coactivation of the zygomaticus major (ZM) and orbicularis oculi (OO) muscles, would be elicited while reading about a favored politician smiling, indicating a more positive disposition and political endorsement. From an embodied simulation perspective, we investigated whether written descriptions of a politician's smile would trigger morphologically different smiles in readers depending on shared or opposing political orientation. In a controlled laboratory reading task, participants were presented with subject-verb phrases describing left- and right-wing politicians smiling or frowning. Concurrently, their facial muscular reactions were measured via electromyography (EMG) at three facial muscles: the ZM and OO, coactive during Duchenne smiles, and the corrugator supercilii (CS), involved in frowning. We found that participants responded with a Duchenne smile, detected at the ZM and OO facial muscles, when exposed to portrayals of smiling politicians of the same political orientation, and reported more positive emotions toward them. In contrast, when reading about outgroup politicians smiling, there was weaker activation of the ZM muscle and no activation of the OO muscle, suggesting a weak non-Duchenne smile, while emotions reported toward outgroup politicians were significantly more negative. Also, an enhanced frown response in the CS was found for ingroup compared to outgroup politicians' frown expressions. The present findings suggest that a politician's smile may go a long way toward influencing electorates through both non-verbal and verbal pathways. They add another layer to our understanding of how language and social information shape embodied effects in a highly nuanced manner. Implications for verbal communication in the political context are discussed.


Subject(s)
Frailty, Smiling, Humans, Smiling/physiology, Reading, Facial Expression, Emotions/physiology, Facial Muscles/physiology, Eyelids
5.
Neuroimage ; 290: 120578, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38499051

ABSTRACT

Face perception is a complex process that involves highly specialized procedures and mechanisms. Investigating face perception can help us better understand how the brain processes fine-grained, multidimensional information. This research aimed to examine how different dimensions of facial information are represented in specific brain regions, or through inter-regional connections, via an implicit face recognition task. To capture the representation of various facial information in the brain, we employed support vector machine decoding, functional connectivity, and model-based representational similarity analysis on fMRI data, resulting in three crucial findings. First, despite the implicit nature of the task, emotions were still represented in the brain, contrasting with all other facial information. Second, the connection between the medial amygdala and the parahippocampal gyrus was found to be essential for the representation of facial emotion in implicit tasks. Third, in implicit tasks, arousal representation occurred in the parahippocampal gyrus, while valence depended on the connection between the primary visual cortex and the parahippocampal gyrus. In conclusion, these findings dissociate the neural mechanisms of emotional valence and arousal, revealing the precise spatial patterns of multidimensional information processing in faces.
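The support vector machine decoding step mentioned in this abstract can be illustrated with a minimal scikit-learn sketch on synthetic data; the trial counts, voxel counts, condition labels, and cross-validation scheme below are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic "voxel patterns": 120 trials x 200 voxels, two hypothetical emotion conditions.
n_trials, n_voxels = 120, 200
labels = np.repeat([0, 1], n_trials // 2)            # e.g., two facial-emotion conditions
patterns = rng.standard_normal((n_trials, n_voxels))
patterns[labels == 1, :20] += 0.8                    # weak condition signal in 20 voxels

# Linear SVM with in-pipeline standardization, scored by 5-fold cross-validation.
decoder = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
accuracy = cross_val_score(decoder, patterns, labels, cv=5).mean()
print(f"mean decoding accuracy: {accuracy:.2f}")
```

Above-chance cross-validated accuracy is the usual criterion for concluding that a region's activity patterns carry information about the decoded dimension.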


Subject(s)
Emotions, Magnetic Resonance Imaging, Humans, Brain/diagnostic imaging, Brain Mapping/methods, Parahippocampal Gyrus/diagnostic imaging, Facial Expression
6.
Sensors (Basel) ; 24(5)2024 Feb 27.
Article in English | MEDLINE | ID: mdl-38475072

ABSTRACT

Understanding the association between subjective emotional experiences and physiological signals is of practical and theoretical significance. Previous psychophysiological studies have shown a linear relationship between dynamic emotional valence experiences and facial electromyography (EMG) activity. However, whether and how subjective emotional valence dynamics relate nonlinearly to facial EMG changes remains unknown. To investigate this issue, we re-analyzed the data of two previous studies that measured dynamic valence ratings and facial EMG of the corrugator supercilii and zygomaticus major muscles from 50 participants who viewed emotional film clips. We employed multiple linear regression analyses and two nonlinear machine learning (ML) models: random forest and long short-term memory. In cross-validation, these ML models outperformed linear regression in terms of the mean squared error and correlation coefficient. Interpretation of the random forest model using the SHapley Additive exPlanation (SHAP) tool revealed nonlinear and interactive associations between several EMG features and subjective valence dynamics. These findings suggest that nonlinear ML models can fit the relationship between subjective emotional valence dynamics and facial EMG better than conventional linear models, highlighting a nonlinear and complex relationship. The findings encourage emotion sensing using facial EMG and offer insight into the subjective-physiological association.
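The linear-versus-nonlinear model comparison described above can be sketched on synthetic data with scikit-learn; the two stand-in "EMG" features and the deliberately nonlinear valence signal below are illustrative assumptions, not the study's actual features or data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic features standing in for corrugator and zygomaticus EMG activity.
n = 500
corrugator = rng.standard_normal(n)
zygomaticus = rng.standard_normal(n)
X = np.column_stack([corrugator, zygomaticus])

# A deliberately nonlinear, non-monotonic valence signal plus noise.
valence = np.tanh(zygomaticus) - 0.5 * corrugator**2 + 0.1 * rng.standard_normal(n)

# Cross-validated R^2 for a linear model vs. a random forest.
linear_r2 = cross_val_score(LinearRegression(), X, valence, cv=5).mean()
forest_r2 = cross_val_score(
    RandomForestRegressor(n_estimators=200, random_state=0), X, valence, cv=5
).mean()
print(f"linear R^2: {linear_r2:.2f}, random forest R^2: {forest_r2:.2f}")
```

When the underlying relationship contains a quadratic or saturating component, the forest's cross-validated score exceeds the linear model's, which is the pattern the abstract reports for real EMG-valence data.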


Subject(s)
Emotions, Facial Expression, Humans, Electromyography, Emotions/physiology, Face, Facial Muscles/physiology, Machine Learning
7.
J Headache Pain ; 25(1): 33, 2024 Mar 11.
Article in English | MEDLINE | ID: mdl-38462615

ABSTRACT

BACKGROUND: The present study used the Facial Action Coding System (FACS) to analyse changes in facial activities in individuals with migraine during resting conditions to determine the potential of facial expressions to convey information about pain during headache episodes. METHODS: Facial activity was recorded in calm and resting conditions by using a camera for both healthy controls (HC) and patients with episodic migraine (EM) and chronic migraine (CM). The FACS was employed to analyse the collected facial images, and intensity scores for each of the 20 action units (AUs) representing expressions were generated. The groups and headache pain conditions were then examined for each AU. RESULTS: The study involved 304 participants, that is, 46 HCs, 174 patients with EM, and 84 patients with CM. Elevated headache pain levels were associated with increased lid tightener activity and reduced mouth stretch. In the CM group, moderate to severe headache attacks exhibited decreased activation in the mouth stretch, alongside increased activation in the lid tightener, nose wrinkle, and cheek raiser, compared to mild headache attacks (all corrected p < 0.05). Notably, lid tightener activation was positively correlated with the Numeric Rating Scale (NRS) level of headache (p = 0.012). Moreover, the lip corner depressor was identified to be indicative of emotional depression severity (p < 0.001). CONCLUSION: Facial expressions, particularly lid tightener actions, served as inherent indicators of headache intensity in individuals with migraine, even during resting conditions. This indicates that the proposed approach holds promise for providing a subjective evaluation of headaches, offering the benefits of real-time assessment and convenience for patients with migraine.
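The reported positive correlation between lid-tightener (AU7) activation and NRS headache levels can be sketched as follows; the sample size, effect size, and variable names are hypothetical, for illustration only:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Hypothetical per-patient values: AU7 ("lid tightener") intensity and NRS pain (0-10).
n_patients = 80
nrs = rng.integers(0, 11, size=n_patients).astype(float)
au7_intensity = 0.3 * nrs + rng.normal(0.0, 1.0, size=n_patients)  # built-in positive trend

# Rank correlation is appropriate for ordinal pain ratings.
rho, p_value = spearmanr(au7_intensity, nrs)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

A significant positive rho in such an analysis would mirror the study's finding that lid-tightener activity tracks headache intensity.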


Subject(s)
Facial Expression, Migraine Disorders, Humans, Migraine Disorders/complications, Headache, Pain, Depression
8.
IEEE Trans Image Process ; 33: 2293-2304, 2024.
Article in English | MEDLINE | ID: mdl-38470591

ABSTRACT

Human emotions contain both basic and compound facial expressions. In many practical scenarios, it is difficult to access all the compound expression categories at one time. In this paper, we investigate comprehensive facial expression recognition (FER) in the class-incremental learning paradigm, where we define well-studied and easily-accessible basic expressions as initial classes and learn new compound expressions incrementally. To alleviate the stability-plasticity dilemma in our incremental task, we propose a novel Relationship-Guided Knowledge Transfer (RGKT) method for class-incremental FER. Specifically, we develop a multi-region feature learning (MFL) module to extract fine-grained features for capturing subtle differences in expressions. Based on the MFL module, we further design a basic expression-oriented knowledge transfer (BET) module and a compound expression-oriented knowledge transfer (CET) module, by effectively exploiting the relationship across expressions. The BET module initializes the new compound expression classifiers based on expression relevance between basic and compound expressions, improving the plasticity of our model to learn new classes. The CET module transfers expression-generic knowledge learned from new compound expressions to enrich the feature set of old expressions, facilitating the stability of our model against forgetting old classes. Extensive experiments on three facial expression databases show that our method achieves superior performance in comparison with several state-of-the-art methods.


Subject(s)
Facial Recognition, Humans, Emotions, Learning, Facial Expression, Factual Databases
9.
Nat Commun ; 15(1): 2443, 2024 Mar 19.
Article in English | MEDLINE | ID: mdl-38499519

ABSTRACT

The ability to make nuanced inferences about other people's emotional states is central to social functioning. While emotion inferences can be sensitive to both facial movements and the situational context in which they occur, relatively little is understood about when these two sources of information are integrated across emotion categories and individuals. In a series of studies, we use one archival and five empirical datasets to demonstrate that people could be integrating, but that emotion inferences are just as well (and sometimes better) captured by knowledge of the situation alone, while isolated facial cues are insufficient. Further, people integrate facial cues more for categories for which they most frequently encounter facial expressions in everyday life (e.g., happiness). People are also moderately stable over time in their reliance on situational cues and integration of cues, and those who reliably utilize situational cues more also have better situated emotion knowledge. These findings underscore the importance of studying variability in the reliance on and integration of cues.


Subject(s)
Emotions, Happiness, Humans, Facial Expression, Movement, Cues (Psychology)
10.
Cereb Cortex ; 34(3)2024 03 01.
Article in English | MEDLINE | ID: mdl-38466112

ABSTRACT

Alexithymia is characterized by difficulties in emotional information processing. However, the underlying reasons for emotional processing deficits in alexithymia are not fully understood. The present study aimed to investigate the mechanism underlying emotional deficits in alexithymia. Using the Toronto Alexithymia Scale-20, we recruited college students with high alexithymia (n = 24) or low alexithymia (n = 24) in this study. Participants judged the emotional consistency of facial expressions and contextual sentences while recording their event-related potentials. Behaviorally, the high alexithymia group showed longer response times versus the low alexithymia group in processing facial expressions. The event-related potential results showed that the high alexithymia group had more negative-going N400 amplitudes compared with the low alexithymia group in the incongruent condition. More negative N400 amplitudes are also associated with slower responses to facial expressions. Furthermore, machine learning analyses based on N400 amplitudes could distinguish the high alexithymia group from the low alexithymia group in the incongruent condition. Overall, these findings suggest worse facial emotion perception for the high alexithymia group, potentially due to difficulty in spontaneously activating emotion concepts. Our findings have important implications for the affective science and clinical intervention of alexithymia-related affective disorders.


Subject(s)
Affective Symptoms, Electroencephalography, Humans, Female, Male, Facial Expression, Evoked Potentials, Emotions
11.
PLoS One ; 19(3): e0300973, 2024.
Article in English | MEDLINE | ID: mdl-38512901

ABSTRACT

OBJECTIVE: Most previous studies have examined emotion recognition in autism spectrum condition (ASC) without intellectual disability (ID). However, ASC and ID co-occur to a high degree. The main aims of the study were to examine emotion recognition in individuals with ASC and co-occurring intellectual disability (ASC-ID) as compared to individuals with ID alone, and to investigate the relationship between emotion recognition and social functioning. METHODS: The sample consisted of 30 adult participants with ASC-ID and a comparison group of 29 participants with ID. Emotion recognition was assessed by the facial emotions test, while social functioning was assessed by the social responsiveness scale-second edition (SRS-2). RESULTS: The accuracy of emotion recognition was significantly lower in individuals with ASC-ID compared to the control group with ID, especially when it came to identifying angry and fearful emotions. Participants with ASC-ID exhibited more pronounced difficulties in social functioning compared to those with ID, and there was a significant negative correlation between emotion recognition and social functioning. However, emotion recognition accounted for only 8% of the variability observed in social functioning. CONCLUSION: Our data indicate severe difficulties in the social-perceptual domain and in everyday social functioning in individuals with ASC-ID.


Subject(s)
Autism Spectrum Disorder, Autistic Disorder, Facial Recognition, Intellectual Disability, Adult, Humans, Autistic Disorder/psychology, Social Interaction, Intellectual Disability/psychology, Emotions, Autism Spectrum Disorder/psychology, Facial Expression
12.
Neuroimage Clin ; 41: 103586, 2024.
Article in English | MEDLINE | ID: mdl-38428325

ABSTRACT

BACKGROUND: Emotion processing deficits are known to accompany depressive symptoms and are often seen in stroke patients. Little is known about the influence of post-stroke depressive (PSD) symptoms and specific brain lesions on altered emotion processing abilities and how these phenomena develop over time. This potential relationship may impact post-stroke rehabilitation of neurological and psychosocial function. To address this scientific gap, we investigated the relationship between PSD symptoms and emotion processing abilities in a longitudinal study design from the first days post-stroke into the early chronic phase. METHODS: Twenty-six ischemic stroke patients performed an emotion processing task on videos with emotional faces ('happy,' 'sad,' 'anger,' 'fear,' and 'neutral') at different intensity levels (20%, 40%, 60%, 80%, 100%). Recognition accuracies and response times were measured, as well as scores of depressive symptoms (Montgomery-Åsberg Depression Rating Scale). Twenty-eight healthy participants matched in age and sex were included as a control group. Whole-brain support-vector regression lesion-symptom mapping (SVR-LSM) analyses were performed to investigate whether specific lesion locations were associated with the recognition accuracy of specific emotion categories. RESULTS: Stroke patients performed worse in overall recognition accuracy compared to controls, specifically in the recognition of happy, sad, and fearful faces. Notably, more depressed stroke patients showed increased processing of specific negative emotions: they responded significantly faster to angry faces and recognized low-intensity sad faces significantly more accurately. These effects, obtained for the first days after stroke, partly persisted to the follow-up assessment several months later. SVR-LSM analyses revealed that inferior and middle frontal regions (IFG/MFG), insula, and putamen were associated with emotion-recognition deficits after stroke. Specifically, recognizing happy facial expressions was influenced by lesions affecting the anterior insula, putamen, IFG, MFG, orbitofrontal cortex, and rolandic operculum. Lesions in the posterior insula, rolandic operculum, and MFG were also related to reduced recognition accuracy of fearful facial expressions, whereas recognition deficits for sad faces were associated with frontal pole, IFG, and MFG damage. CONCLUSION: PSD symptoms facilitate the processing of negative emotional stimuli, specifically angry and sad facial expressions. The recognition accuracy of different emotional categories was linked to brain lesions in emotion-related processing circuits, including the insula, basal ganglia, IFG, and MFG. In summary, our study provides support for psychosocial and neural factors underlying emotional processing after stroke, contributing to the pathophysiology of PSD.


Subject(s)
Depression, Facial Recognition, Humans, Longitudinal Studies, Emotions/physiology, Anger, Brain/diagnostic imaging, Facial Expression, Facial Recognition/physiology
13.
BMC Psychiatry ; 24(1): 226, 2024 Mar 26.
Article in English | MEDLINE | ID: mdl-38532335

ABSTRACT

BACKGROUND: Patients with schizophrenia (SCZ) exhibit deficits in recognizing facial expressions with unambiguous valence. However, only a limited number of studies have examined how these patients fare in interpreting facial expressions with ambiguous valence (for example, surprise). Thus, we aimed to explore the influence of emotional background information on the recognition of ambiguous facial expressions in SCZ. METHODS: A 3 (emotion: negative, neutral, and positive) × 2 (group: healthy controls and SCZ) experimental design was adopted in the present study. The experimental materials consisted of 36 images of negative emotions, 36 images of neutral emotions, 36 images of positive emotions, and 36 images of surprised facial expressions. In each trial, a briefly presented surprised face was preceded by an affective image. Participants (36 SCZ and 36 healthy controls (HC)) were required to rate the emotional experience induced by the surprised facial expressions on a 9-point rating scale. The experimental data were analyzed using analyses of variance (ANOVAs) and correlation analysis. RESULTS: First, the SCZ group reported a more positive emotional experience under the positive cued condition compared to the negative cued condition. Meanwhile, the HC group reported the strongest positive emotional experience in the positive cued condition, a moderate experience in the neutral cued condition, and the weakest in the negative cued condition. Second, the SCZ (vs. HC) group showed longer reaction times (RTs) for recognizing surprised facial expressions. The severity of schizophrenia symptoms in the SCZ group was negatively correlated with rating scores for emotional experience under the neutral and positive cued conditions. CONCLUSIONS: Recognition of surprised facial expressions was influenced by background information in both SCZ and HC, and by negative symptoms in SCZ. The present study indicates that the role of background information should be fully considered when examining the ability of patients with SCZ to recognize ambiguous facial expressions.


Subject(s)
Facial Recognition, Schizophrenia, Humans, Emotions, Recognition (Psychology), Facial Expression, China
14.
Sci Rep ; 14(1): 5574, 2024 03 06.
Article in English | MEDLINE | ID: mdl-38448642

ABSTRACT

Seeing an angry individual in close physical proximity can not only result in a larger retinal representation of that individual and an enhanced resolution of emotional cues, but may also increase motivation for rapid visual processing and action preparation. The present study investigated the effects of stimulus size and emotional expression on the perception of happy, angry, non-expressive, and scrambled faces. We analyzed event-related potentials (ERPs) and behavioral responses of N = 40 participants who performed a naturalness classification task on real and artificially created facial expressions. While the emotion-related effects on accuracy for recognizing authentic expressions were modulated by stimulus size, ERPs showed only additive effects of stimulus size and emotional expression, with no significant interaction with size. This contrasts with previous research on emotional scenes and words. Effects of size were present in all included ERPs, whereas emotional expressions affected the N170, EPN, and LPC, irrespective of size. These results imply that the decoding of emotional valence in faces can occur even for small stimuli. Supra-additive effects in faces may necessitate larger size ranges or dynamic stimuli that increase arousal.


Subject(s)
Emotions, Facial Expression, Humans, Physical Examination, Anger, Visual Perception
15.
IEEE Trans Vis Comput Graph ; 30(5): 2206-2216, 2024 May.
Article in English | MEDLINE | ID: mdl-38437082

ABSTRACT

In Mixed Reality (MR), users' heads are largely (if not completely) occluded by the MR Head-Mounted Display (HMD) they are wearing. As a consequence, one cannot see their facial expressions and other communication cues when interacting locally. In this paper, we investigate how displaying virtual avatars' heads on top of the (HMD-occluded) heads of participants in a Video See-Through (VST) Mixed Reality local collaborative task could improve their collaboration as well as social presence. We hypothesized that virtual heads would convey more communicative cues (such as eye direction or facial expressions) hidden by the MR HMDs and lead to better collaboration and social presence. To do so, we conducted a between-subject study (n = 88) with two independent variables: the type of avatar (CartoonAvatar/RealisticAvatar/NoAvatar) and the level of facial expressions provided (HighExpr/LowExpr). The experiment involved two dyadic communication tasks: (i) the "20-question" game, where one participant asks questions to guess a hidden word known by the other participant, and (ii) an urban planning problem, where participants have to solve a puzzle by collaborating. Each pair of participants performed both tasks using a specific type of avatar and facial animation. Our results indicate that while adding an avatar's head does not necessarily improve social presence, the amount of facial expressions provided through the social interaction does have an impact. Moreover, participants rated their performance higher when observing a realistic avatar but rated the cartoon avatars as less uncanny. Taken together, our results contribute to a better understanding of the role of partial avatars in local MR collaboration and pave the way for further research exploring collaboration in different scenarios, with different avatar types or MR setups.


Subject(s)
Augmented Reality, Humans, User-Computer Interface, Computer Graphics, Facial Expression
16.
Biol Psychol ; 187: 108771, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38460756

ABSTRACT

The ability to detect and recognize facial emotions emerges in childhood and is important for understanding social cues, but we know relatively little about how individual differences in temperament may influence early emotional face processing. We used a sample of 419 children (Mage = 10.57 years, SD = 1.75; 48% female; 77% White) to examine the relation between temperamental shyness and early stages of emotional face processing (assessed using the P100 and N170 event-related potentials) during different facial expressions (neutral, anger, fear, and happy). We found that higher temperamental shyness was related to greater P100 activation to faces expressing anger and fear relative to neutral faces. Further, lower temperamental shyness was related to greater N170 activation to faces expressing anger and fear relative to neutral faces. There were no relations between temperamental shyness and neural activation to happy faces relative to neutral faces for P100 or N170, suggesting specificity to faces signaling threat. We discuss findings in the context of understanding the early processing of facial emotional display of threat among shy children.


Subject(s)
Facial Recognition, Shyness, Child, Humans, Female, Male, Facial Recognition/physiology, Emotions/physiology, Evoked Potentials/physiology, Anger, Facial Expression, Electroencephalography
17.
Biol Psychol ; 187: 108774, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38471619

ABSTRACT

There has been disagreement regarding the relationship among the three components of emotional responses: subjective experience, external performance, and physiological response. To investigate this issue further, this study compared the effects of active and passive suppression of facial expressions on subjective experiences and event-related potentials (ERPs) in two experiments. The two methods of expression suppression produced opposite ERP patterns for negative emotional stimuli: compared with the free-viewing condition, active suppression of expression decreased, while passive suppression increased, the amplitude of the late positive potential (LPP) when viewing negative emotional stimuli. Further, while active suppression had no effect on participants' emotional experience, passive suppression enhanced it. Among the three components of emotional responses, facial expressions are more closely related to the physiological response of the brain than to subjective experience, and whether suppression is initiated by the participant determines the decrease or increase in the brain's physiological response (i.e., the LPP). The findings reveal the important role of individual subjective initiative in modulating the relationship among the components of emotional response, providing new insights into effective emotion regulation.


Subject(s)
Emotional Regulation, Facial Expression, Humans, Evoked Potentials/physiology, Emotions/physiology, Brain/physiology, Electroencephalography
18.
Cortex ; 174: 93-109, 2024 May.
Article in English | MEDLINE | ID: mdl-38493568

ABSTRACT

Contrary to the extensive research on processing subliminal and/or unattended emotional facial expressions, only a minority of studies have investigated the neural correlates of consciousness (NCCs) of emotions conveyed by faces. In the present high-density electroencephalography (EEG) study, we first employed a staircase procedure to identify each participant's perceptual threshold of the emotion expressed by the face and then compared the EEG signals elicited in trials where the participants were aware with the activity elicited in trials where participants were unaware of the emotions expressed by these, otherwise identical, faces. Drawing on existing knowledge of the neural mechanisms of face processing and NCCs, we hypothesized that activity in frontal electrodes would be modulated in relation to participants' awareness of facial emotional content. More specifically, we hypothesized that the NCC of fear seen on someone else's face could be detected as a modulation of a later and more anterior (i.e., at frontal sites) event-related potential (ERP) than the face-sensitive N170. By adopting a data-driven approach and cluster-based statistics to the analysis of EEG signals, the results were clear-cut in showing that visual awareness of fear was associated with the modulation of a frontal ERP component in a 150-300 msec interval. These insights are dissected and contextualized in relation to prevailing theories of visual consciousness and their proposed NCC benchmarks.


Subject(s)
Consciousness , Facial Recognition , Humans , Electroencephalography , Fear , Emotions , Evoked Potentials , Facial Expression
19.
Psychol Sci ; 35(4): 405-414, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38489402

ABSTRACT

Ethnic out-group members are disproportionately often the victims of misidentification. The so-called other-race effect (ORE), the tendency to remember faces of individuals belonging to one's own ethnic in-group better than faces belonging to an ethnic out-group, has been identified as one causal ingredient in such tragic incidents. Investigating an important aspect of the ORE, namely emotional expression, the seminal study by Ackerman and colleagues (2006) found that White participants remembered neutral White faces better than neutral Black faces but, crucially, remembered angry Black faces better than angry White faces (i.e., a reversed ORE). In the current study, we sought to replicate this study and to directly tackle the potential causes of diverging results in later work. Three hundred ninety-six adult White U.S. citizens completed our study, in which we manipulated the kind of stimuli employed (those of the original study vs. more standardized ones) and whether participants already knew of the recognition task at the encoding phase. Additionally, participants were asked about the unusualness of the presented faces. We were able to replicate the results of the Ackerman et al. (2006) study with the original stimuli but not with the more standardized stimuli.


Subject(s)
Anger , Mental Recall , Adult , Humans , Recognition, Psychology , Ethnicity , Facial Expression
20.
Autism Res ; 17(4): 824-837, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38488319

ABSTRACT

Accumulating evidence suggests that atypical emotion processing in autism may generalize across stimulus domains. However, this evidence comes from studies examining explicit emotion recognition. It remains unclear whether domain-general atypicality also applies to implicit emotion processing in autism, and what this implies for real-world social communication. To investigate this, we employed a novel cross-modal emotional priming task to assess implicit emotion processing of spoken/sung words (primes) through their influence on subsequent emotional judgments of faces/face-like objects (targets). We assessed whether implicit emotional priming differed between 38 autistic and 38 neurotypical individuals across age groups as a function of prime and target type. Results indicated no overall group differences across age groups, prime types, and target types. However, differential, domain-specific developmental patterns emerged for the autism and neurotypical groups. For neurotypical individuals, speech but not song primed the emotional judgment of faces across ages. This speech-orienting tendency was not observed across ages in the autism group, as priming of faces by speech was absent in autistic adults. These results underscore the importance of the delicate weighting of speech versus song orientation in implicit emotion processing throughout development, providing more nuanced insight into the emotion-processing profile of autistic individuals.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Adult , Humans , Facial Expression , Emotions , Autistic Disorder/psychology , Judgment